
    A demonstration of 'broken' visual space

    It has long been assumed that there is a distorted mapping between real and ‘perceived’ space, based on demonstrations of systematic errors in judgements of slant, curvature, direction and separation. Here, we have applied a direct test to the notion of a coherent visual space. In an immersive virtual environment, participants judged the relative distance of two squares displayed in separate intervals. On some trials, the virtual scene expanded by a factor of four between intervals, although, in line with recent results, participants did not report any noticeable change in the scene. We found that there was no consistent depth ordering of objects that could explain the distance matches participants made in this environment (e.g. A > B > D yet also A < C < D) and hence no single one-to-one mapping between participants’ perceived space and any real 3D environment. Instead, factors that affect pairwise comparisons of distances dictate participants’ performance. These data contradict, more directly than previous experiments, the idea that the visual system builds and uses a coherent 3D internal representation of a scene.
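
    The logic of this result can be checked mechanically: a set of pairwise ‘farther than’ judgements is consistent with a single depth ordering exactly when the directed comparison graph is acyclic. The sketch below uses hypothetical judgements mirroring the abstract’s A > B > D yet A < C < D example (not the study’s data or code) and tests for such an ordering with Kahn’s topological sort:

```python
# Hypothetical pairwise judgements: (far, near) means "far judged farther than
# near". These mirror the abstract's A > B > D yet A < C < D example.
judgements = [("A", "B"), ("B", "D"),   # A > B > D
              ("C", "A"), ("D", "C")]   # A < C and C < D, i.e. C > A, D > C

def consistent_ordering(pairs):
    """Return a depth ordering consistent with every pair, or None if the
    comparison graph contains a cycle (so no single ordering can exist)."""
    nodes = {x for pair in pairs for x in pair}
    nearer_than = {n: set() for n in nodes}   # edges: farther -> nearer
    indegree = {n: 0 for n in nodes}
    for far, near in pairs:
        if near not in nearer_than[far]:
            nearer_than[far].add(near)
            indegree[near] += 1
    queue = [n for n in nodes if indegree[n] == 0]   # Kahn's algorithm
    order = []
    while queue:
        n = queue.pop()
        order.append(n)
        for m in nearer_than[n]:
            indegree[m] -= 1
            if indegree[m] == 0:
                queue.append(m)
    return order if len(order) == len(nodes) else None

print(consistent_ordering(judgements))   # None: the judgements are intransitive
```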

    Size and shape constancy in consumer virtual reality

    With the increasing popularity of consumer virtual reality headsets for research and other applications, it is important to understand the accuracy of 3D perception in VR. We investigated the perceptual accuracy of near-field virtual distances using a size and shape constancy task in two commercially available devices. Participants wore either the HTC Vive or the Oculus Rift and adjusted the size of a virtual stimulus to match the geometric qualities (size and depth) of a physical stimulus they were able to refer to haptically. The judgments participants made allowed for an indirect measure of their perception of the egocentric virtual distance to the stimuli. The data show under-constancy and are consistent with research from carefully calibrated psychophysical techniques. There was no difference in the degree of constancy found in the two headsets. We conclude that consumer virtual reality headsets provide a sufficiently high degree of accuracy in distance perception to allow them to be used confidently in future experimental vision science and other research applications in psychology.
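
    The indirect distance measure rests on the standard size-distance invariance relation: for a fixed retinal angle, perceived size scales with perceived distance. A minimal sketch of that inference, assuming small stimuli and using illustrative numbers rather than the study’s procedure:

```python
import math

def inferred_perceived_distance(rendered_dist_m, set_size_m, physical_size_m):
    """Infer perceived egocentric distance from a size match, assuming
    size-distance invariance. The participant's setting subtends
    theta = 2*atan(set_size / (2*rendered_dist)); if that setting is
    perceived to equal the physical size, then
    perceived_dist = physical_size / (2 * tan(theta / 2))."""
    theta = 2 * math.atan(set_size_m / (2 * rendered_dist_m))
    return physical_size_m / (2 * math.tan(theta / 2))

# Illustrative numbers only: a 0.10 m physical object at 0.6 m rendered
# distance. Over-sizing the virtual match (0.12 m) implies the distance
# looked compressed, the under-constancy pattern the abstract reports.
print(inferred_perceived_distance(0.6, 0.12, 0.10))  # 0.5 m < 0.6 m
```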

    Vestibular Facilitation of Optic Flow Parsing

    Simultaneous object motion and self-motion give rise to complex patterns of retinal image motion. In order to estimate object motion accurately, the brain must parse this complex retinal motion into self-motion and object motion components. Although this computational problem can be solved, in principle, through purely visual mechanisms, extra-retinal information that arises from the vestibular system during self-motion may also play an important role. Here we investigate whether combining vestibular and visual self-motion information improves the precision of object motion estimates. Subjects were asked to discriminate the direction of object motion in the presence of simultaneous self-motion, depicted either by visual cues alone (i.e. optic flow) or by combined visual/vestibular stimuli. We report a small but significant improvement in object motion discrimination thresholds with the addition of vestibular cues. This improvement was greatest for eccentric heading directions and negligible for forward movement, a finding that could reflect increased relative reliability of vestibular versus visual cues for eccentric heading directions. Overall, these results are consistent with the hypothesis that vestibular inputs can help parse retinal image motion into self-motion and object motion components.
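
    The reliability-dependent gain described here is what the standard maximum-likelihood cue-combination model predicts: the combined estimate’s variance is the inverse sum of the single-cue inverse variances, so an added vestibular cue matters most where the visual cue is weak. A sketch with illustrative thresholds, not the study’s data:

```python
# Standard maximum-likelihood cue combination: combining two independent cues
# with discrimination thresholds (sigmas) gives a combined threshold of
# (sigma_a^-2 + sigma_b^-2)^-0.5, always at or below the better single cue.
def combined_sigma(sigma_visual, sigma_vestibular):
    return (sigma_visual**-2 + sigma_vestibular**-2) ** -0.5

# Forward heading: visual self-motion cue already reliable -> little gain.
print(combined_sigma(2.0, 8.0))   # ~1.94, barely below the visual-only 2.0
# Eccentric heading: visual cue less reliable -> larger predicted gain.
print(combined_sigma(6.0, 8.0))   # 4.8, well below the visual-only 6.0
```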

    Effect of Pictorial Depth Cues, Binocular Disparity Cues and Motion Parallax Depth Cues on Lightness Perception in Three-Dimensional Virtual Scenes

    Surface lightness perception is affected by scene interpretation. There is some experimental evidence that perceived lightness under bi-ocular viewing conditions is different from perceived lightness in actual scenes, but there are also reports that viewing conditions have little or no effect on perceived color. We investigated how mixes of depth cues affect perception of lightness in three-dimensional rendered scenes containing strong gradients of illumination in depth. Observers viewed a virtual room (4 m width × 5 m height × 17.5 m depth) with checkerboard walls and floor. In four conditions, the room was presented with or without binocular disparity (BD) depth cues and with or without motion parallax (MP) depth cues. In all conditions, observers were asked to adjust the luminance of a comparison surface to match the lightness of test surfaces placed at seven different depths (8.5-17.5 m) in the scene. We estimated lightness versus depth profiles in all four depth cue conditions. Even when observers had only pictorial depth cues (no MP, no BD), they partially but significantly discounted the illumination gradient in judging lightness. Adding either MP or BD led to significantly greater discounting, and both cues together produced the greatest discounting. The effects of MP and BD were approximately additive. BD had a greater influence at near distances than at far distances. These results suggest that surface lightness perception is modulated by three-dimensional perception/interpretation using pictorial, binocular-disparity, and motion-parallax cues additively. We propose a two-stage (2D and 3D) processing model for lightness perception.
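
    Partial discounting of an illumination gradient is often quantified with a Brunswik-style constancy ratio, computed between a no-constancy match (tracking retinal luminance) and a full-constancy match (tracking surface reflectance). A sketch with illustrative values; the study’s own analysis may differ:

```python
import math

def constancy_index(match_lum, no_constancy_lum, full_constancy_lum):
    """Brunswik-style constancy ratio in log luminance: 0 means matches track
    retinal luminance (no discounting of the illumination gradient), 1 means
    matches track surface reflectance (full discounting)."""
    return ((math.log(match_lum) - math.log(no_constancy_lum)) /
            (math.log(full_constancy_lum) - math.log(no_constancy_lum)))

# Illustrative: a surface deep in the dimly lit part of the scene would need a
# 5 cd/m^2 match with no discounting and 20 cd/m^2 with full discounting;
# an observer who sets 12 cd/m^2 shows partial discounting.
print(constancy_index(12, 5, 20))  # ~0.63
```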

    Being Barbie: The Size of One’s Own Body Determines the Perceived Size of the World

    A classical question in philosophy and psychology is whether the sense of one's body influences how one visually perceives the world. Several theoreticians have suggested that our own body serves as a fundamental reference in visual perception of sizes and distances, although compelling experimental evidence for this hypothesis is lacking. In contrast, modern textbooks typically explain the perception of object size and distance by the combination of information from different visual cues. Here, we describe full body illusions in which subjects experience the ownership of a doll's body (80 cm or 30 cm) and a giant's body (400 cm) and use these as tools to demonstrate that the size of one's sensed own body directly influences the perception of object size and distance. These effects were quantified in ten separate experiments with complementary verbal, questionnaire, manual, walking, and physiological measures. When participants experienced the tiny body as their own, they perceived objects to be larger and farther away, and when they experienced the large-body illusion, they perceived objects to be smaller and nearer. Importantly, despite identical retinal input, this “body size effect” was greater when the participants experienced a sense of ownership of the artificial bodies compared to a control condition in which ownership was disrupted. These findings are fundamentally important as they suggest a causal relationship between the representations of body space and external space. Thus, our own body size affects how we perceive the world.
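
    The “identical retinal input” premise is geometric: scaling an object's size and distance by the same factor leaves its subtended visual angle unchanged, so retinal information alone cannot distinguish a small, near world from a large, far one. A minimal sketch with illustrative numbers:

```python
import math

def visual_angle_deg(size_m, distance_m):
    """Retinal angle subtended by an object of a given size at a given distance."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

# A 0.5 m box at 2 m and a 2.0 m box at 8 m (everything scaled by 4) subtend
# the same angle, so the retinal image alone cannot tell them apart; the
# body-scaling results suggest the felt body size resolves this ambiguity.
print(visual_angle_deg(0.5, 2.0))  # ~14.25 deg
print(visual_angle_deg(2.0, 8.0))  # ~14.25 deg (identical)
```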

    Distorted body representations are robust to differences in experimental instructions

    Several recent reports have shown that even healthy adults maintain highly distorted representations of the size and shape of their body. These distortions have been shown to be highly consistent across different study designs and dependent measures. However, previous studies have found that visual judgments of size can be modulated by the experimental instructions used, for example, by asking for judgments of the participant’s subjective experience of stimulus size (i.e., apparent instructions) versus judgments of actual stimulus properties (i.e., objective instructions). Previous studies investigating internal body representations have relied exclusively on ‘apparent’ instructions. Here, we investigated whether apparent versus objective instructions modulate findings of distorted body representations underlying position sense (Exp. 1) and tactile distance perception (Exp. 2), as well as the conscious body image (Exp. 3). Our results replicate the characteristic distortions previously reported for each of these tasks and further show that these distortions are not affected by instruction type (i.e., apparent vs. objective). These results show that the distortions measured with these paradigms are robust to differences in instructions and do not reflect a dissociation between perception and belief.

    Size-Distance Transformations

    No full text available